Text File | 1992-07-06 | 2KB | 39 lines
****************************************************************************
WHAT IS THIS ABOUT?
****************************************************************************
INTRODUCTION AND GENERALITIES
*****************************
I.1) Your computer is not perfect (a painful reality).
-----------------------------------------------------
A modern computer can be very powerful, efficient and quick at the usual
computations; compare its speed with a human's and you'll laugh. However,
there are tasks that your computer can't cope with: tasks in which logic
plays only a small part, yet which a human finds very easy to solve. In fact,
the architecture of today's computers is not suited to this kind of problem.
That is why some physicists, biologists and mathematicians have been trying
to imitate neurons and the functions of the brain.
I.2) A short history.
---------------------
Even though neural network models have been studied since the forties, the
real breakthrough has come in the last 10 years. Let's see how it happened:
- 1943 McCulloch and Pitts study the first formal neuron model.
- 1949 Hebb writes down the first learning rule for teaching neural networks.
- 1954 An analogy with a magnetic spins system is made.
- 1961 Rosenblatt studies the 'Perceptron' (single-layer model);
       the Widrow-Hoff learning rule is introduced.
- 1974 Multi-layer networks appear.
- 1982 Hopfield studies totally connected networks.
- 1985 Back-propagation algorithm is discovered.
- 1987 E. Gardner links statistical physics to networks;
       Kohonen elaborates a self-organized model.
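To give a first taste of the ideas in this history (the details are left to
THEORY.DOC), here is a minimal sketch of the McCulloch-Pitts formal neuron
mentioned above. The function name, weights and threshold are illustrative
choices, not taken from this document: the neuron fires (outputs 1) when the
weighted sum of its binary inputs reaches a fixed threshold.

```python
def formal_neuron(inputs, weights, threshold):
    """McCulloch-Pitts style unit: 1 if the weighted input sum
    reaches the threshold, 0 otherwise."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= threshold else 0

# With weights (1, 1) and threshold 2, this single neuron
# computes the logical AND of its two binary inputs.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", formal_neuron((x1, x2), (1, 1), 2))
```

Learning rules such as Hebb's or Widrow-Hoff's then adjust the weights and
threshold automatically instead of fixing them by hand.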
(to read next: THEORY.DOC)